
# Two-stage training

## Kb Whisper Small
**License:** Apache-2.0
A Whisper model released by the Swedish National Library, optimized for Swedish. Trained on 50,000+ hours of Swedish speech data, it outperforms the original OpenAI version on Swedish.
**Tags:** Speech Recognition, Transformers, Other
KBLab · 28.61k · 3
## Dragoman
**License:** Apache-2.0
Dragoman is a sentence-level English-to-Ukrainian translation model trained with a two-stage process; it achieves a BLEU score of 32.34 on the FLORES-101 English-Ukrainian devtest subset.
**Tags:** Machine Translation, Multilingual
lang-uk · 407 · 12
## Ahma 7B
**License:** Apache-2.0
Ahma-7B is a 7-billion-parameter decoder-only Transformer based on the Meta Llama (v1) architecture, pretrained from scratch entirely on Finnish.
**Tags:** Large Language Model, Transformers, Other
Finnish-NLP · 201 · 8
## Animagine Xl 3.0 Base
**License:** Other
Animagine XL 3.0 Base is the foundational version of an advanced anime text-to-image model, focused on establishing core functionality and refining key components.
**Tags:** Image Generation, English
cagliostrolab · 810 · 45
© 2025 AIbase